#Angular.js Development Agency
Explore tagged Tumblr posts
Text
Web Design Companies in Qatar: Expert Guide to Costs & Quality [2025]
Qatar’s digital world boasts more than 175 web design companies that have delivered 179 successful projects and earned 120 client reviews. These companies offer services ranging from simple websites at QAR 5,000 ($1,370) to sophisticated solutions that can cost QAR 50,000 ($13,700) or more.
Website design companies in Qatar work with prestigious clients from the real estate, hospitality, and finance sectors. Our team has analyzed the local web design market thoroughly to help you choose the right development partner. This complete guide shows you the costs, quality standards, and essential factors to consider when selecting the ideal website design company in Qatar for your project.
Qatar Web Design Market Overview
The Application Development Software market in Qatar will reach QAR 335.96 million by 2025. The digital transformation market should grow at an impressive CAGR of 16.43% between 2025 and 2030.
Current Market Size and Growth
Qatar’s web design sector stands as a vital component of the country’s digital ecosystem. The market shows reliable growth, especially in the e-commerce and corporate sectors. The rising smartphone penetration rate has made web design companies focus on mobile-first development approaches.
Key Players and Specializations
The market features several prominent web design companies:
· Artisans Digital — specializing in UI/UX design and responsive development
· Make it WOW — focusing on custom website development
· Developer Bazaar Technologies — expert in mobile-friendly design
· Curve Design Qatar — specializing in marketing integration
· Daya — known for authentic brand development
These companies provide services ranging from QAR 5,000 for simple websites to QAR 100,000 for complex corporate solutions. Local web design agencies have earned recognition through prestigious awards like the MENA Digital Awards.
Industry Standards and Certifications
Web design companies in Qatar follow strict quality standards and cultural considerations. Successful agencies must show proficiency in dual-language development that supports both Arabic and English interfaces. They must also comply with Qatar’s data protection laws and e-commerce guidelines.
Professional certifications are the foundations of the industry. The Certified Web Developer (CWD) certification, recognized throughout Qatar, confirms expertise in modern web development practices. Companies must also prove their competency in mobile optimization, responsive design, and local payment gateway integration.
Website Development Trends in Qatar
Qatar’s web development scene has grown significantly, and companies now use state-of-the-art frameworks and technologies to create exceptional digital experiences.
Popular Design Frameworks
Web design companies in Qatar depend on several robust frameworks. React.js leads the market, especially for building dynamic single-page applications. Developers choose Angular.js over traditional approaches when they need scalability, making it the top choice for social networking and interactive platforms. Vue.js has become popular because it’s lightweight and well suited to progressive web apps, with a compact 18–21 kilobyte size.
Mobile-First Development Practices
Smartphone usage has reshaped development practices in Qatar. Progressive Web Apps (PWAs) have become a core technology, offering fast loading times and offline capabilities on devices of all types. These applications provide uninterrupted experiences and work well even when network conditions aren’t ideal. Accelerated Mobile Pages (AMP) technology has grown in importance because it boosts mobile loading speeds and user experience.
Emerging Technologies
Qatar’s web development scene shows rapid tech advancement. Companies now blend AI and machine learning into custom web solutions to create tailored experiences. AR and VR technologies shape website development’s future. Qatar’s web design companies also use AI-powered chatbots to improve customer service and create personalized user interactions.
Cloud infrastructure and data centers have made Qatar a digital innovation hub. Microsoft’s global datacenter region and Google Cloud’s Doha cloud region have contributed significantly and could generate over QAR 65.50 billion in the coming years.
Local vs International Web Design Companies
Businesses looking for digital solutions must think about several factors when choosing between local and international web design companies in Qatar.
Comparative Advantages
Qatar’s local web design companies bring unique benefits through their deep grasp of regional market dynamics. These benefits include:
· A deep grasp of how local consumers behave
· Knowledge of Qatar’s business rules
· Strong ties to the local business community
· Swift support and maintenance response
· Clear view of what regional clients want in design
Communication and Cultural Factors
Website development in the Qatari market depends heavily on cultural awareness. Local web design companies have shown they excel at understanding Arabic design elements and cultural subtleties. International firms often struggle with cultural context, while local developers naturally create imagery and content that appeals to Qatari audiences.
Project Management Approaches
Local web design teams blend traditional and agile practices in their project management. This mix helps them adapt to local business needs while meeting international standards. Their teams include PMP-certified consultants who lead diverse working groups.
Being close to clients makes face-to-face meetings and immediate communication possible, which creates better teamwork during development. International firms might have better prices, but local companies respond faster and give more customized service because they work in the same time zone.
Qatar’s local web design companies have built an impressive track record with successful projects in businesses of all types. They know Qatar’s data protection laws and content rules, which helps websites stay compliant with local regulations. All the same, these companies keep up with global standards by using international best practices in their development work.
Success Stories and Case Studies
Qatar’s top web design companies have achieved remarkable results in a variety of sectors, from healthcare to industrial projects.
Notable Qatar Website Projects
Gulf Healthcare International exemplifies successful web implementation with its customized healthcare solutions throughout the Middle East. Qatar Polymer Industrial Company’s website redesign stands as another success story in industrial development. The company earned ISO 9001:2015 certification for its excellence.
ROI Analysis and Metrics
Web design projects in Qatar have shown impressive returns on investment. Recent case studies reveal that businesses earn QR 5 in revenue for every QR 1 they invest in digital marketing and web development. Here are the key success metrics:
· 30% increase in online sales through targeted campaigns
· 25% reduction in customer acquisition costs
· 500+ completed web projects by leading firms
Lessons Learned from Failed Projects
Studies of unsuccessful projects shed light on what web design companies in Qatar should avoid. These are the main reasons projects fail:
Poor communication and frequent requirement changes cause most project setbacks. Many projects faced early challenges because of unclear scope definition and poor project management skills. Research shows that educational background substantially affects project success rates.
Today’s successful web design companies in Qatar make client communication and detailed project documentation their top priorities. Companies have enhanced their delivery processes and quality assurance measures through careful post-project analysis.
Conclusion
Qatar’s web design companies are pioneering digital transformation by combining local expertise with global standards. Our analysis shows these companies deliver exceptional returns — businesses earn QR 5 for every QR 1 they invest in digital development.
These local agencies succeed largely because of their deep grasp of Qatar’s cultural subtleties and business rules. Their success stories span healthcare to industrial projects, and they consistently deliver results through mobile-first development and emerging technologies.
Quality stands supreme in Qatar’s web design industry. The most successful projects highlight the value of clear communication, detailed documentation, and compliance with local and international standards. The market will reach QAR 335.96 million by 2025, making Qatar an attractive hub for web development services.
Companies should pick partners who show cultural awareness, technical skills, and solid knowledge of the digital world. This strategy will give a website that looks great and generates real business outcomes.
Text
In this article, I'll share some technical expertise and help you understand all that stuff that looks freaky at first sight. There will be information about stacks, frameworks, programming languages, etc. - everything that can help one understand the development tools that are commonly used in web development agencies.
What is MEAN stack development?
So, today we are going to talk about MEAN - not the adjective, but the abbreviation for the JavaScript web development stack that consists of:
MongoDB - database.
Express.js - application server, used for the backend.
Angular.js - web application framework, used for the frontend.
Node.js - web server.
The MEAN stack today is a modern and more flexible analog of the LAMP (Linux, Apache, MySQL, PHP) web development stack. While LAMP requires a specific OS, MEAN can be developed on any OS - Windows, Mac, or Linux.
Technical features of the MEAN technology stack
MongoDB
MongoDB is a document-oriented database specially created to store hierarchical data structures (documents) and is implemented using a NoSQL approach. This NoSQL approach represents a fundamental shift in the persistence strategy. The programmer spends less time writing SQL statements and more time writing map/reduce functions in JavaScript. This eliminates huge layers of transformation logic, since MongoDB natively produces the JavaScript Object Notation (JSON) format. As a result, writing REST web services is extremely easy.
Express.js
Express.js is a free and open-source web application framework that runs on top of Node.js. It is used for backend development. The big step beyond LAMP is the transition from traditional page generation on the server side to targeting single-page applications (SPAs) on the client side. Express allows you to manage and route/generate pages on the server side, but other components of the MEAN stack, such as Angular.js, push the work more toward the client side.
Angular.js
Angular.js is an open-source frontend MVC (Model-View-Controller) framework in JavaScript. The transition to SPAs does not simply move the MVC artifacts from the server to the client device. It is also a jump from a synchronous mentality to an asynchronous, essentially event-driven one. And perhaps most importantly, it is the movement from page-oriented applications to component-oriented applications.
Node.js
Node.js is a JavaScript development platform for the server side. It transforms JavaScript from a highly specialized language into a general-purpose language. Node.js replaces Apache from the LAMP stack. But Node.js is much more than just a web server. In fact, the finished application is not deployed on a separate web server; instead, the web server itself is included in the application and automatically installed as part of the MEAN stack. As a result, the deployment process is greatly simplified, since the required version of the web server is explicitly defined along with the other runtime dependencies.
Why do we prefer using the MEAN stack over LAMP?
During the last 10 years of our existence, we have always been trying to improve our development process. We rejected ineffective, slow, or narrowly focused tools and worked hard to deliver only high-quality results. We tried using LAMP, and we actually still use it in some projects, but the transition to MEAN stack development is where we are heading. Why?
Here is the answer: one programming language (JavaScript) means less confusion with syntax. On AngularJS it is convenient to build rich client applications that do not require page reloads (MVC, an event model, routing, the possibility of creating components, plus this killer feature: two-way data binding). On Node.js, in conjunction with the Express framework, it is good to build a RESTful API backend, where the server sends only the data as JSON and the client itself handles the presentation. This reduces the coupling between client and server and simplifies the creation of a mobile application in the future (the same API will come in handy).
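To make the RESTful API point concrete, here is a minimal, illustrative Express.js endpoint that returns only JSON data; the route and sample data are assumptions for the sketch, not code from the original article:

```js
// Minimal Express.js REST endpoint sketch (illustrative route and data)
const express = require('express');
const app = express();

// GET /api/projects returns plain JSON; the client (e.g. an Angular SPA)
// is responsible for turning that data into a rendered view.
app.get('/api/projects', (req, res) => {
  res.json([
    { id: 1, name: 'Corporate website' },
    { id: 2, name: 'E-commerce store' },
  ]);
});

app.listen(3000, () => console.log('API listening on port 3000'));
```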
Of course, a RESTful API can be done in PHP, but the PHP process is created and terminated with each request, spending resources on initialization, while the Node.js process stays in memory the whole time. Asynchrony is another reason: even the fact that a typical MEAN stack example can be implemented twice as fast because of asynchrony makes us believe it is one of the best options. Node.js does not block the current process when accessing external resources (for example, the database). This allows, among other things, several requests to be executed in parallel. Node.js also has a great NPM package manager and a great package community. For almost any task, be it PDF generation or email sending, it's easy to find a package with a concise interface and add it to your application. With these technologies, MEAN stack developers can build applications of higher quality in less time, delivering a quick result to the customer while using fewer resources. The author of this article is Liza Kerimova, internet marketer at Artjoker, a software development company that specializes in web development. Our goal is to turn clients' ideas into excellent results!
Text
Devstree UK Services is a leading AngularJS development company dedicated to building simple, sophisticated applications that inspire growth. Our experts have built trendsetting apps for versatile business needs and helped several enterprises stay ahead of the curve. Hire AngularJS developers from Devstree and smarten up your IT landscape with our robust app solutions.
Link
Our skilled UI developers are always focused on delivering cutting edge solutions to our clients that align with their business strategy. We work closely with our clients to deliver high quality solutions. We have developed applications using frameworks like Angular.js and Bootstrap CSS.
Deliver exceptional experiences with our UI solution
UI developers in our team are dedicated to delivering solutions which look stunning and provide seamless, intuitive access for the end users. Once we receive a website design in any format from our clients, the UI development team focuses on making it available to users. At KGN Technologies, we have a professional UI development team which is capable of providing cost-effective, timely and, on top of that, best-in-quality service to our clients.
We specialize in single page application and CMS based website development for mobile, tablet and desktop. We have built applications using frameworks like Angular.js and Bootstrap.
PROFESSIONAL UI DEVELOPMENT SERVICES
We deliver unique and powerful UI development solutions which align well with our clients' requirements. At KGN Technologies, we work hard to create beautiful and functional applications while improving usability. We dig deep to find the right solutions that work for our clients, and we do that by creating user interfaces that are useful, usable and desirable.
KGN Technologies | A UI Development Firm
Our talented user interface developers can make using even the most complex website a breeze. With extensive experience of developing responsive, functional websites that look exceptional and work beautifully on all devices, KGN Technologies can transform your concept into something extraordinary.
The visual appeal of your website or mobile app creates an immediate and lasting impression on your users. By targeting users on a more emotional level through user-centered design, our UI developers can turn your concept into reality with strong coding skills and elicit an emotional response in the user which is directly linked to their actions.
Text
Electron JS Development Services Company India, Electron JS Development
A trusted and honoured Electron App Development company right at your doorstep. Electron is the perfect desktop app development framework. Use Astec's top Electron development services to create your desktop application.
Electron JS Development Company
Frontend Development
Angular.js Development
React.JS Development
Node.Js Development
Vue.js Development
Electron JS Development Services for startups. Pure web agency builds cross-platform desktop apps and provides Electron JS consulting services. If you are looking to build an application that is super-fast, scalable, and real-time, Node.js is the answer. Electron JS represents a Node.js application written in JavaScript with an additional layer to access native Mac, Windows, and Linux APIs. Build your amazing desktop apps with a top Electron development company.
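As a rough illustration of that idea, here is a minimal Electron main-process sketch; the window size and the index.html file name are assumptions for the example:

```js
// main.js - minimal Electron main process (illustrative sketch)
const { app, BrowserWindow } = require('electron');

function createWindow() {
  // A native desktop window whose content is an ordinary web page
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadFile('index.html');
}

// Create the window once Electron has finished initializing
app.whenReady().then(createWindow);

// Quit when all windows are closed (except on macOS, by convention)
app.on('window-all-closed', () => {
  if (process.platform !== 'darwin') app.quit();
});
```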
ICEEL aims at providing first-rate solutions for web and app development; we provide the services below for a long-term relationship:
• Satisfaction guaranteed.
• On-Time Delivery.
• Award-winning services.
• 24-7 customer support.
• Website Maintenance Support.
• Import Export Management Courses
#electronjs#electron js development company india#electronjs designer#electronjs development solutions#electronjs developer#electronjs developer india
Photo
World Web Technology https://nearmeplus.com/listings/item/1032465 The advent of the internet has open doors to millions of new opportunities. With such high competition, you need to make an outstanding online presence for your business. In a nutshell, it’s all about delivering great customer experiences. And this task has its own challenges. We at World Web Technology Pvt. Ltd., help our customers overcome these challenges and help them connect with their audience in a better way. We are a full-service WordPress and WooCommerce agency assisting businesses to build their brand. Our services include web design, PHP Web Development, Codeigniter Development, open-source development, and WordPress development. Web & CMS Development, WordPress Web Development, PHP Web Development, Laravel Web Development, Web API Development, CakePHP Development, Yii Development, ASP.NET Web Development, E-Commerce Development, Magento Web Development, WooCommerce Development, WooCommerce Plugin Development, Designing Services, Website Design, WordPress Theme Design, Magento Theme Design, Landing Page Design, Responsive Web Design, Mobile App Development, Android App Development, iPhone App Development, Hybrid App Development, AR VR App Development, Wearable App Development, React Native App Development, Hire Dedicated Developers, Hire Android App Developer, Hire iPhone App Developer, Hire WordPress Developer, Hire WooCommerce Developer, Hire Python Developer, Hire ROR Developer, Hire Magento Developer, Hire QA Analyst, Hire PHP Developer, Hire Website Designer, Digital Marketing, SEO Services, PPC Services, SMM Services, Open Source Development, ROR Development, Python Development, Angular.js Development, Node.js Development, #nearmeplus #appdevelopment #webdevelopment https://www.instagram.com/p/CEG6EtEHcNG/?igshid=4ks90lf03qk6
Text
Web & App Development Company
Mobile App Development Software Development Web Development Web Design Digital Marketing Agencies SEO IT Services Companies Magento Development Shopify Developers PPC Social Media Marketing Offshore Software Development Wordpress Development Node js Development Angular.js Development React.js Development Vue js Development eCommerce Development eCommerce App Development Android App Development Top Enterprise Mobile App Development Companies Google Adwords Facebook Marketing, Instagram Marketing PHP Development Managed IT Services Website Development IT Strategy Consulting Cross Platform App Development Wearable App Development Services Progressive Web App Development Open Source web development services NFT development services HIRE FULL STACK DEVELOPERS DRUPAL WEB DEVELOPMENT FLUTTER APPLICATION DEVELOPMENT REACT NATIVE APPLICATION DEVELOPMENT ON-DEMAND APP DEVELOPMENT COMPANY REAL ESTATE MOBILE APP DEVELOPMENT Food Delivery App and Website Development Finance & Insurance Software Development Education & E-learning Software Development Healthcare & Fitness Software Development TRAVELS & HOSPITALITY WEB-MOBILE APP DEVELOPMENT Complete Solutions For Media & Entertainment TRANSPORTATION & AUTOMOBILE WEB-MOBILE APP DEVELOPMENT Classified Application Development Complete Solutions For Event & Booking Apps Content Writing Social Media Optimization
Text
Which coding language will be best suited for your business
Here are 11 coding languages and technologies that may be best suited for your business:
1- Java:- Java is one of the oldest programming languages. Since the inception of Java, programming has evolved greatly. It is still widely used by dedicated software development companies. However, contrary to the belief that this programming language is mostly used in the IT field, it is also used in the business field. Many of Java's features make this particular language especially suitable for commercial use.
2- Python:- Python continues to top the lists of most used and loved programming languages, even in 2020. Python offers a large variety of open-source libraries for data science, image recognition, and many more. It is widely used by web applications such as YouTube, Pinterest, and Instagram.
Free to use, easy to use, effective communication of Python with other languages and platforms, scalability, and extensibility are some of the reasons why Python is useful in your business tasks.
3- Kotlin:- Sure, Java was the official Android language, but Kotlin pushed it aside. In fact, it has reigned as Google's programming language of choice since 2017. In fact, not only does the tech giant offer extensive support for its favorite child - or rather, its language - but it also offers comprehensive benefits to developers who use Kotlin. Developers have a lot of special features when they take advantage of the language.
4- JavaScript:- Where there is Java, for sure, there will also be JavaScript. What sets JavaScript apart from other programming languages is that it is a front-end language. It is mostly used to create front-end interactive applications.
The thing that makes this language useful in business is the efficiency of running scripts both on the client and server side. It can be used to generate the content of a web page before it is actually transmitted to a web browser. Its speed, high-quality tuning, and frameworks are some of the advantages of JavaScript.
5- PHP:- PHP is an easy to use, server-side, open source programming language and one of the leading languages used in web development. There are many reasons why developers and companies choose PHP for IT solutions. The first is that PHP is flexible. What do we mean by this? PHP can be used in many platforms such as Microsoft, UNIX, Linux, etc. Almost all servers and databases support this language.
Another reason is that PHP is budget friendly. It does not require any fees or downloads because it is distributed under the General Public License. PHP is also a great choice for web hosting. This is why many agencies offer plans on their PHP-powered websites.
6- Angular JS:- Angular.js, a JavaScript framework with a steep learning curve, is widely used by major companies to build single-page web applications on the go. AngularJS is one of the technologies that should be investigated and used for commercial purposes. It has various business advantages, so companies must pay attention to it these days.
Whether large or small, a company must take advantage of the latest innovations in today's rapidly evolving technology.
7- Flutter Mobile Application :- Flutter is a record-breaking framework for premium, high-performance, creativity, quality, and fidelity mobile app development. We have been the first to use Flutter since it was launched by Google. Unleash the power of mobility with cross-platform compatibility, native performance, and a rich user interface. Connect with our expert Flutter developers for flexibility and low development costs.
8- React JS:- React.JS is often mentioned next to web development frameworks, even though it is a UI development library. It is a set of pre-written code that can be used to build applications from scratch. According to Statista's report, this front-end programming library was the most used technology for web development in 2021 compared to other popular web development solutions. 40.14% of respondents reported using React.js for this purpose.
9- Node Js:- Node.js is an open source runtime environment that was first released in 2009. Early users of Node include companies such as IBM, PayPal, and Microsoft, who use Node for its speed and scalability capabilities. Node allows you to write JavaScript both on the server side and on the client side, so Node developers are usually well versed in both aspects, making the transition between client and server environments relatively easy.
10- Blockchain:- Blockchain Development is valuable to the entities doing business with each other. With distributed ledger technology, authorized participants can access the same information at the same time to improve efficiency, build trust, and eliminate friction. Blockchain also allows for rapid resolution of scale and scope, and many solutions can be adapted to perform multiple tasks across industries.
11- Salesforce :- Salesforce is among the oldest and most popular cloud-based customer relationship management (CRM) systems. The software service provider has a wide range of CRM products for sales, services, marketing, commerce, sustainability, safety, and experiences. In addition to its suite of needs-based software solutions, Salesforce offers custom packages for companies of every size to connect marketing, sales, commerce, service, and IT teams with a unified solution for every stage of their customer journeys.
Text
Angular.js Development work from home job/internship at Tache Technologies Private Limited
Job title: Angular.js Development work from home job/internship at Tache Technologies Private Limited Company: Tache Technologies Private Limited Job description: professionals in India. It is an independent market research agency offering high-quality marketing research, analysis… in India and the Asia Pacific, by helping them to connect with the local populace and delivering a unique…
Text
Web Design Training With Career Scope In Nepal
Want to create your career in Web Designing? Then you're in the right place. Here, you'll learn everything about Web Design Training and its career scope in Nepal. I would like to start by defining it.
What Is Web Design?
Web design is the process of planning websites that are displayed on the web. Web design is typically concerned with the user experience rather than with software development. Designing websites for desktop browsers used to be the primary focus of web design, but since the mid-2010s, design for mobile and tablet has become more and more important.
Top 13 Web Development Tools
Visual Studio Code
Sublime Text
Sketch
NPM
jQuery
Bootstrap
Angular.JS
Chrome DevTools
Sass
Grunt
CodePen
TypeScript
GitHub
Web Design Career Scope In Nepal
With the booming IT industry around the globe, the web has become the world's most important means of communication, and websites serve as its backbone. If you become a web developer in Nepal, you'll find that web development really is an exciting career option if developers choose technologies wisely. Web design therefore has a lot of career scope and offers good career prospects. Every company or institute needs a website, so the search for designers who can create professional websites is ever-present. These professionals are needed in every field, from large companies to educational institutions, and from small businesses to private individuals.
What qualifications and training are required to become a Web Designer?
It is possible to enter the field of web design after graduating from university or after leaving school. Employers advertising jobs for graduates are likely to require a digital media design degree or a related subject. A portfolio of your best web design work is required, regardless of whether you have a related degree. School leavers who want to become web designers should keep an eye out for web designer apprenticeships and should make sure they have relevant work experience to talk about.
What are the key skills for web designers?
Imagination
Creativity
Patience
Attention to detail
Analytical skills
Communication skills
Technical ability
Excellent IT skills
SEO knowledge
Experience with programs like PhotoShop and InDesign
What do web designers do?
The task of a web designer is to create a website's design and layout. It can also refer to updating an existing site or creating a fresh one. It differs from the work of web developers, who turn website designs into reality or write the code that tells the different parts of a website how to work together. Typically, there is some crossover between these two roles. The job involves the following responsibilities:
Designing website layouts
Building sample sites
Meeting with clients to discuss project details and/or requirements
Demonstrating draft websites and getting feedback on them
Staying up to date with the latest technology and software advances
Developing knowledge and skills in appropriate software/programming languages, like HTML and JavaScript
Producing products that are easy to use, effective, and appealing
Editing and retouching digital images
Being a member of a multidisciplinary team
Extra hours may be required to meet deadlines.
Typical employers of web designers
Software companies
IT consultancies
Specialist web design companies
Large corporate organizations
Any organization that uses computer systems
A person with appropriate experience can often work as a freelancer or be self-employed. Jobs are advertised online, through recruitment agencies, and through career centres.
Best Web Design training institute in Nepal
If you want to build your career in Web Design and are interested in becoming a professional web developer, the School of Information Technologies offers Web Design training, which includes a web designing course for learning website design skills and techniques and gaining knowledge of Web Design. Click here for registration
What is Front End Development training?
Front End Development is the process of developing the graphical interface of a website using HTML, CSS, and JavaScript so that users can view the website as well as interact with it.
What does a Front End Developer do?
A Front End Developer designs responsive websites using HTML, CSS, JavaScript, and Bootstrap (a framework) for client sites. The main responsibility of the Front End Developer is to make sure website visitors are able to interact with the page easily. To do this, they use design, technology, and programming to shape a website's appearance and handle debugging. Any element of a website that you can see, click on, or otherwise interact with is the work of front-end development.
Front End Development Scope and Salary in Nepal
A front-end development career path may be the right choice for someone with a knack for aesthetics and an interest in coding. There is high demand for Front End Developers in Nepal, since most businesses in the country are expanding and require websites to operate; they need Front End Developers to build those websites. The most important factor in determining a salary is experience: more experience results in a higher salary. The yearly average income for Front End Developers in Nepal is 294,791 Nepalese rupees. The average salary includes housing, transportation, and other benefits. The salary of a Front End Developer varies greatly depending on experience, skills, gender, or location.
Common Tasks Performed by a Front End Developer
A front-end developer's responsibilities may vary slightly from company to company, but you can generally expect them to include the following:
Improving the user experience
Bringing concepts to life using HTML, JavaScript, and CSS
Creating and maintaining the user interface
Designing websites for mobile devices
Creating tools that improve site interaction regardless of which browser is being used
Managing software workflow
Implementing SEO best practices
Testing for usability and fixing bugs
Conclusion
If you want to build your career in Web Design and are interested in becoming a professional web developer and learning website design skills, techniques, and knowledge, then the School of Information Technologies offers Web Design training that includes a web designing course syllabus for learning web design.
Text
What Are The Skills Of Full-stack Developers
A Full Stack Developer in Dubai is an engineer who works on both the client side and the server side of a software application. This sort of developer deals with the full stack of a software application, meaning front-end development, back-end development, database, server, API, and version control systems. Hence the name "Full Stack" Developer.
Suppose you’re using your Instagram application. Each time you refresh, new content is loaded on your screen. You can like a picture, add new ones, search for profiles, and do much more. Although it gives a seamless user experience, there’s a lot that goes on in the backend. HTTP requests are made to the Instagram servers to retrieve and load information. This works with the help of backend frameworks.
Typically, every application consists of the front end, the backend, and the database.
A Full Stack Developer is involved in creating an application from start to finish. They design the front end and the backend of an application while ensuring its efficiency, reliability, and other important qualities.
A full-stack developer translates client requirements into the overall architecture and implements the new systems. A Full-Stack Developer doesn’t master every technology. However, the professional is expected to work on the client side as well as the server side and understand what is happening when developing an application. He or she should have a genuine interest in all software technologies.
Why Do You Need a Full-Stack Developer?
Here are some clear reasons why you should hire a full-stack development professional:
A full-stack developer helps you keep every part of the system running smoothly
Full-stack developers can assist everyone on the team and greatly reduce the time and technical costs of team communication
If one person takes on multiple roles, it saves your organization personnel, infrastructure, and operational costs
Full Stack Developer Skills You Need to Know
The following is the Full Stack developer skill set:
1) Front-end technology
Full-stack developers should be experts in essential front-end technologies like HTML5, CSS3, and JavaScript. Knowledge of third-party libraries like jQuery, LESS, Angular, and React JS is desirable.
2) Development languages
The full-stack engineer should know at least one server-side programming language such as Java, Python, Ruby, .NET, and so on.
3) Database and cache
Knowledge of various DBMS technologies is another important requirement for a full-stack developer. MySQL, MongoDB, Oracle, and SQL Server are commonly used for this purpose. Knowledge of caching mechanisms like Varnish, Memcached, and Redis is a plus.
4) Basic design ability
To become a successful Full-Stack web developer, knowledge of design is also recommended. The individual should know the principles of basic prototype design and UI/UX design.
5) Server
Exposure to handling Apache or Nginx servers is desirable. A good background in Linux helps tremendously in administering servers.
6) Version control system (VCS)
A version control system allows full-stack developers to keep track of all the changes made in the codebase. Knowledge of Git helps full-stack developers understand how to fetch the latest code, update parts of the code, and make changes to another developer's code without breaking things.
7) Working with APIs (REST and SOAP):
Knowledge of web services or APIs is also important for full-stack developers. Knowledge of how to create and consume REST and SOAP services is desirable.
The digital marketing recruitment agency is hiring full-stack developers in Dubai and Cyber Security specialists in Dubai, hires Magento developers in the UAE, and maintains a better IT compliance management system.
Other Pieces of the Puzzle:
The ability to write quality unit tests
He or she should have a complete understanding of automated processes for building, testing, documenting, and deploying an application at scale
An awareness of security concerns is important, as each layer has its own vulnerabilities
Knowledge of algorithms and data structures is also an essential requirement for professional full-stack developers
Full Stack developer meaning: A full-stack web developer is a technology expert who can work on both the front end and the back end of any application.
A Full-Stack web developer helps you keep every part of the system running smoothly.
The Full Stack Developer skills required are front-end technology, development languages, databases, basic design ability, servers, working with APIs, and version control systems.
A Java full-stack developer can build entire Java applications, including the front end, back end, database, APIs, server, and version control. Java Full Stack developer skills include Core Java, servlets, APIs, databases, web architecture, and so on.
A software stack is a collection of programs that are used together to produce a specific result.
LAMP stands for Linux, Apache, MySQL, and PHP.
MERN is the full form of MongoDB, Express, React, and Node.js.
MEAN stands for MongoDB, Express, Angular.js, and Node.js.
A Full Stack Developer can earn up to $112,000 per year.
The biggest myth about full-stack developers is that they write every kind of code themselves, which isn't true.
#full stack developer dubai#IT compliance management#digital marketing recruitment agency#Cyber Security Dubai#hire magento developers
Text
The Definitive Guide to JavaScript SEO (2021 Edition)
Posted by PierceBrelinsky
The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance. As a technical SEO specialist and web dev enthusiast at an award-winning digital marketing agency, I’d like to share my perspective on modern JavaScript SEO based on industry best practices and my own agency experience. In this article, you'll learn how to optimize your JS-powered website for search in 2021.
What is JavaScript SEO?
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.
Preventing, diagnosing, and troubleshooting ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.
Ensuring that web pages are discoverable by search engines through linking best practices.
Improving page load times for pages parsing and executing JS code for a streamlined User Experience (UX).
Is JavaScript good or bad for SEO?
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
How does JavaScript affect SEO?
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
Rendered content
Links
Lazy-loaded images
Page load times
Meta data
What are JavaScript-powered websites?
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
App Shell Model.
This template is called an app shell and is the foundation for progressive web applications (PWAs). We’ll explore this next.
How to check if a site is built with JavaScript
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Angular by Google
React by Facebook
Vue by Evan You
JavaScript SEO for core content
Here’s an example: Modern web apps are being built on JavaScript frameworks, like Angular, React, and Vue. JavaScript frameworks allow developers to quickly build and scale interactive web applications. Let’s take a look at the default project template for Angular.js, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
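The served document of such an app typically looks something like this simplified sketch (the element and bundle names vary by project and are assumptions here):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>My Angular App</title>
  </head>
  <body>
    <!-- The visible content is injected into this empty root element by JavaScript -->
    <app-root></app-root>
    <script src="runtime.js"></script>
    <script src="polyfills.js"></script>
    <script src="main.js"></script>
  </body>
</html>
```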
Now we can see that this HTML document is almost completely devoid of any content. There are only the app-root and a few script tags in the body of the page. This is because the main content of this single page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
JavaScript SEO for internal links
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
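As an illustration of the difference (the URLs and handler name below are made up for the example):

```html
<!-- Recommended: a real anchor tag with an href and descriptive anchor text -->
<a href="/services/web-design">Web design services</a>

<!-- "Pseudo" links like these rely on JS event handlers and are typically not crawled -->
<span onclick="goToPage('/services/web-design')">Web design services</span>
<div class="link" data-url="/services/web-design">Web design services</div>
```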
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
JavaScript SEO for lazy-loading images
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
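A typical scroll-based implementation looks roughly like this sketch (the class name and data attribute are assumptions):

```js
// Lazy-load images by listening for the window's scroll event
document.addEventListener('scroll', () => {
  document.querySelectorAll('img.lazy[data-src]').forEach((img) => {
    // If the image has scrolled into the viewport, swap data-src into src
    if (img.getBoundingClientRect().top < window.innerHeight) {
      img.src = img.dataset.src;
      img.classList.remove('lazy');
    }
  });
});
```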
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
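A rough sketch of that pattern, reusing the assumed markup from the previous snippet:

```js
// Lazy-load images with IntersectionObserver instead of a scroll listener
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real image source
      obs.unobserve(img);        // stop watching once loaded
    }
  });
});

document.querySelectorAll('img.lazy[data-src]').forEach((img) => observer.observe(img));
```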
This code shows that the IntersectionObserver API triggers a callback when any observed element becomes visible. It’s more flexible and robust than the on-scroll event listener and is supported by modern Googlebot. This code works because of how Googlebot resizes its viewport in order to “see” your content (see below).
You can also use native lazy-loading in the browser. This is supported by Google Chrome, but note that it is still an experimental feature. Worst case scenario, it will just get ignored by Googlebot, and all images will load anyway:
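In markup, it is just an attribute on the image tag (the file name here is a placeholder):

```html
<img src="product-photo.jpg" loading="lazy" alt="Product photo" width="400" height="300">
```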
Native lazy-loading in Google Chrome.
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
JavaScript SEO for page speed
JavaScript can also affect page load times, which is an official ranking factor in Google’s mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this? A few common tactics follow, with a small markup sketch after the list:
Minifying JavaScript
Deferring non-critical JS until after the main content is rendered in the DOM
Inlining critical JS
Serving JS in smaller payloads
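As a small, hedged illustration of the "defer" and "inline critical JS" tactics (file names are assumptions):

```html
<head>
  <!-- Inline only the small amount of JS that is critical for the first paint -->
  <script>
    document.documentElement.classList.add('js');
  </script>

  <!-- Defer non-critical, minified bundles until the HTML has been parsed -->
  <script src="app.min.js" defer></script>
  <script src="analytics.min.js" defer></script>
</head>
```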
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it’s important to make sure that any served to clients is coded and delivered efficiently to help safeguard rankings.
JavaScript SEO for meta data
Also, it’s important to note that SPAs that utilize a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with a Node.js package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
When a user visits a React website, a GET request is sent to the server for the ./index.html file.
The server then sends the index.html page to the client, containing the scripts to launch React and React Router.
The web application is then loaded on the client-side.
If a user clicks on a link to go on a new page (/example), a request is sent to the server for the new URL.
React Router intercepts the request before it reaches the server and handles the change of page itself. This is done by locally updating the rendered React components and changing the URL client-side.
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. But rather, the React components (like headers, footers, and body content) hosted on root ./index.html file are simply being reorganized to display different content. This is why they’re called Single Page Applications!
Potential SEO issues: So, it’s important to use a package like React Helmet for making sure that users are being served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may be crawling the same metadata for every page, or worse, none at all!
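A minimal sketch of that idea with React Helmet; the component, title, and description below are assumptions rather than a prescribed implementation:

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Each routed "view" renders its own unique metadata
function ExamplePage() {
  return (
    <div>
      <Helmet>
        <title>Example Page | My SPA</title>
        <meta name="description" content="Unique description for this view." />
      </Helmet>
      <h1>Example Page</h1>
    </div>
  );
}

export default ExamplePage;
```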
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
How does Google handle JavaScript?
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when GoogleBot crawls a web page:
Crawl
Render
Index
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google Web Rendering Services (WRS), as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
Google crawl, render, and index process.
In other words, Google crawls and indexes content in two waves:
The first wave of indexing, or the instant crawling of the static HTML sent by the webserver
The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript
Google wave indexing. Source: Google I/O'18
The bottom line is that content dependent on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, they’ve significantly improved its web crawlers in recent years.
Googlebot was recently upgraded to the latest stable release of the Chromium headless browser in May 2019. This means that their web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google has a rate limit on how frequently they can crawl a given website because of limited computing resources. We already know that Google defers JavaScript to be executed later to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in line in its Web Rendering Services queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Blocked in robots.txt
Timeouts
Errors
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
Real-world application: JavaScript SEO for e-commerce
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
How to test and debug JavaScript SEO issues
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
Visualize the page with Google’s Webmaster Tools. This helps you to view the page from Google’s perspective.
Use the site search operator to check Google’s index. Make sure that all of your JavaScript content is being indexed properly by manually checking Google.
Debug using Chrome’s built-in dev tools. Compare and contrast what Google “sees” (source code) with what users see (rendered code) and ensure that they align in general.
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
Google Webmaster Tools
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test your pages using Google Webmaster tools, such as:
URL Inspection tool in Search Console
Mobile-Friendly Test
Google Mobile-Friendly Test.
The goal is simply to visually compare and contrast your content visible in your browser and look for any discrepancies in what is being displayed in the tools.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
Site: Search Operator
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
Copy and paste any content that you’re not sure that Google is indexing after the site: operator and your domain name, and then press the return key. If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, it means your JavaScript content might need some help gaining visibility.
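For example, a check for a JavaScript-rendered product description might look like this (the domain and phrase are hypothetical):

```
site:example.com "hand-stitched leather laptop sleeve"
```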
Here’s what this looks like in the Google SERP:
Chrome Dev Tools
Another method you can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.
Right-click anywhere on a web page to display the options menu and then click “View Source” to see the static HTML document in a new tab.
You can also click “Inspect Element” after right-clicking to view the content that is actually loaded in the DOM, including JavaScript.
Inspect Element.
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
How to fix JavaScript rendering issues
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal Javascript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few different implementations of JavaScript that are more search-friendly than client-side rendering, to avoid offloading JS to both users and crawlers:
Server-side rendering (SSR). This means that JS is executed on the server for each request. One way to implement SSR is with a Node.js library like Puppeteer (a minimal sketch of this idea appears after this list). However, this can put a lot of strain on the server.
Hybrid rendering. This is a combination of both server-side and client-side rendering. Core content is rendered server-side before being sent to the client. Any additional resources are offloaded to the client.
Dynamic rendering. In this workaround, the server detects the user agent of the client making the request. It can then send pre-rendered JavaScript content to search engines, for example. Any other user agents will need to render their content client-side. For example, Google Webmasters recommend a popular open-source solution called Rendertron for implementing dynamic rendering.
Incremental Static Regeneration, or updating static content after a site has already been deployed. This can be done with frameworks like Next.js for React or Nuxt.js for Vue. These frameworks have a build process that will pre-render every page of your JS application to static assets that you can serve from something like an S3 bucket. This way, your site can get all of the SEO benefits of server-side rendering, without the server management!
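As a hedged sketch of the pre-rendering idea behind the SSR and dynamic-rendering options above, here is what a minimal Puppeteer-based renderer could look like (the URL is hypothetical):

```js
// Pre-render a URL with Puppeteer: load the page, run its JavaScript,
// and return the fully rendered HTML (e.g. to serve to search engine bots).
const puppeteer = require('puppeteer');

async function renderPage(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for JS-driven requests to settle
  const html = await page.content();                   // the rendered DOM as an HTML string
  await browser.close();
  return html;
}

renderPage('https://www.example.com/').then((html) => console.log(html.length));
```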
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note that for websites built on a content management system (CMS) that already pre-renders most content, such as WordPress or Shopify, this typically isn't an issue.
Key takeaways
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— John (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that homes in on the nuts and bolts of technical SEO.
0 notes
Text
The Definitive Guide to JavaScript SEO (2021 Edition)
Posted by PierceBrelinsky
The web is in a golden age of front-end development, and JavaScript and technical SEO are experiencing a renaissance. As a technical SEO specialist and web dev enthusiast at an award-winning digital marketing agency, I’d like to share my perspective on modern JavaScript SEO based on industry best practices and my own agency experience. In this article, you'll learn how to optimize your JS-powered website for search in 2021.
What is JavaScript SEO?
JavaScript SEO is the discipline of technical SEO that’s focused on optimizing websites built with JavaScript for visibility by search engines. It’s primarily concerned with:
Optimizing content injected via JavaScript for crawling, rendering, and indexing by search engines.
Preventing, diagnosing, and troubleshooting ranking issues for websites and SPAs (Single Page Applications) built on JavaScript frameworks, such as React, Angular, and Vue.
Ensuring that web pages are discoverable by search engines through linking best practices.
Improving page load times for pages parsing and executing JS code for a streamlined User Experience (UX).
Is JavaScript good or bad for SEO?
It depends! JavaScript is essential to the modern web and makes building websites scalable and easier to maintain. However, certain implementations of JavaScript can be detrimental to search engine visibility.
How does JavaScript affect SEO?
JavaScript can affect the following on-page elements and ranking factors that are important for SEO:
Rendered content
Links
Lazy-loaded images
Page load times
Meta data
What are JavaScript-powered websites?
When we talk about sites that are built on JavaScript, we’re not referring to simply adding a layer of JS interactivity to HTML documents (for example, when adding JS animations to a static web page). In this case, JavaScript-powered websites refer to when the core or primary content is injected into the DOM via JavaScript.
App Shell Model.
This template is called an app shell and is the foundation for progressive web applications (PWAs). We’ll explore this next.
How to check if a site is built with JavaScript
You can quickly check if a website is built on a JavaScript framework by using a technology look-up tool such as BuiltWith or Wappalyzer. You can also “Inspect Element” or “View Source” in the browser to check for JS code. Popular JavaScript frameworks that you might find include:
Angular by Google
React by Facebook
Vue by Evan You
JavaScript SEO for core content
Here’s an example: Modern web apps are being built on JavaScript frameworks, like Angular, React, and Vue. JavaScript frameworks allow developers to quickly build and scale interactive web applications. Let’s take a look at the default project template for Angular.js, a popular framework produced by Google.
When viewed in the browser, this looks like a typical web page. We can see text, images, and links. However, let’s dive deeper and take a peek under the hood at the code:
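The screenshot of that code isn't reproduced here, so below is a rough sketch of what a stripped-down index.html for a default Angular project typically looks like (bundle file names and the exact markup vary by Angular CLI version and are illustrative):

```html
<!doctype html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>AngularApp</title>
    <base href="/">
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <!-- All visible content is injected into <app-root> by JavaScript at runtime -->
    <app-root></app-root>
    <script src="runtime.js"></script>
    <script src="polyfills.js"></script>
    <script src="main.js"></script>
  </body>
</html>
```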
Now we can see that this HTML document is almost completely devoid of content. There is only an empty app-root element and a few script tags in the body of the page. This is because the main content of this single-page application is dynamically injected into the DOM via JavaScript. In other words, this app depends on JS to load key on-page content!
Potential SEO issues: Any core content that is rendered to users but not to search engine bots could be seriously problematic! If search engines aren’t able to fully crawl all of your content, then your website could be overlooked in favor of competitors. We’ll discuss this in more detail later.
JavaScript SEO for internal links
Besides dynamically injecting content into the DOM, JavaScript can also affect the crawlability of links. Google discovers new pages by crawling links it finds on pages.
As a best practice, Google specifically recommends linking pages using HTML anchor tags with href attributes, as well as including descriptive anchor texts for the hyperlinks:
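For illustration (the URL and anchor text are placeholders), a crawlable link looks like this:

```html
<!-- Crawlable: a standard anchor tag with an href attribute and descriptive anchor text -->
<a href="/products/blue-widgets">Shop blue widgets</a>
```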
However, Google also recommends that developers not rely on other HTML elements — like div or span — or JS event handlers for links. These are called “pseudo” links, and they will typically not be crawled, according to official Google guidelines:
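By contrast, here is a rough sketch of the kind of "pseudo" links being warned against (the goTo() handler is hypothetical):

```html
<!-- Not recommended: links that depend on JavaScript event handlers instead of href -->
<span onclick="goTo('/products/blue-widgets')">Shop blue widgets</span>
<a onclick="goTo('/products/blue-widgets')">Shop blue widgets</a> <!-- anchor tag without an href -->
```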
Despite these guidelines, an independent, third-party study has suggested that Googlebot may be able to crawl JavaScript links. Nonetheless, in my experience, I’ve found that it’s a best practice to keep links as static HTML elements.
Potential SEO issues: If search engines aren’t able to crawl and follow links to your key pages, then your pages could be missing out on valuable internal links pointing to them. Internal links help search engines crawl your website more efficiently and highlight the most important pages. The worst-case scenario is that if your internal links are implemented incorrectly, then Google may have a hard time discovering your new pages at all (outside of the XML sitemap).
JavaScript SEO for lazy-loading images
JavaScript can also affect the crawlability of images that are lazy-loaded. Here’s a basic example. This code snippet is for lazy-loading images in the DOM via JavaScript:
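The snippet itself isn't reproduced here, so the sketch below illustrates the scroll-listener pattern being described (class names, offsets, and image paths are illustrative):

```html
<img class="lazy" data-src="/images/product.jpg" alt="Product photo">
<script>
  // Swap data-src into src only once the user scrolls the image near the viewport.
  // Googlebot never fires "scroll" events, so images loaded this way may never be rendered for it.
  document.addEventListener('scroll', function () {
    document.querySelectorAll('img.lazy').forEach(function (img) {
      if (img.getBoundingClientRect().top < window.innerHeight + 200) {
        img.src = img.dataset.src;
        img.classList.remove('lazy');
      }
    });
  });
</script>
```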
Googlebot supports lazy-loading, but it does not “scroll” like a human user would when visiting your web pages. Instead, Googlebot simply resizes its virtual viewport to be longer when crawling web content. Therefore, the “scroll” event listener is never triggered and the content is never rendered by the crawler.
Here’s an example of more SEO-friendly code:
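A reconstruction of that more SEO-friendly pattern might look like this (again, class names and paths are illustrative):

```html
<img class="lazy" data-src="/images/product.jpg" alt="Product photo">
<script>
  // IntersectionObserver fires the callback when an observed element enters the viewport,
  // including Googlebot's tall, resized viewport — no scroll event required.
  const observer = new IntersectionObserver(function (entries, obs) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src;
        obs.unobserve(img); // stop watching once the image has loaded
      }
    });
  });
  document.querySelectorAll('img.lazy').forEach(function (img) {
    observer.observe(img);
  });
</script>
```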
This code uses the IntersectionObserver API, which triggers a callback when any observed element becomes visible. It's more flexible and robust than the on-scroll event listener and is supported by modern Googlebot. This approach works because of how Googlebot resizes its viewport in order to “see” your content (see below).
You can also use native lazy-loading in the browser. This is supported by Google Chrome, but note that it is still an experimental feature. Worst-case scenario, it will simply be ignored by Googlebot, and all images will load anyway:
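A minimal sketch of native lazy-loading (the image path and dimensions are placeholders):

```html
<!-- The browser decides when to fetch the image based on its distance from the viewport -->
<img src="/images/product.jpg" loading="lazy" alt="Product photo" width="600" height="400">
```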
Native lazy-loading in Google Chrome.
Potential SEO issues: Similar to core content not being loaded, it’s important to make sure that Google is able to “see” all of the content on a page, including images. For example, on an e-commerce site with multiple rows of product listings, lazy-loading images can provide a faster user experience for both users and bots!
JavaScript SEO for page speed
JavaScript can also affect page load times, which is an official ranking factor in Google's mobile-first index. This means that a slow page could potentially harm rankings in search. How can we help developers mitigate this?
Minifying JavaScript
Deferring non-critical JS until after the main content is rendered in the DOM (see the sketch after this list)
Inlining critical JS
Serving JS in smaller payloads
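As a rough sketch of the deferring and inlining points above (file names are illustrative), the markup might look like this:

```html
<head>
  <!-- Inline only the small amount of JS needed for above-the-fold rendering -->
  <script>
    /* critical JS here, e.g. feature detection or layout toggles */
  </script>
</head>
<body>
  <!-- ...main content rendered in HTML... -->
  <!-- defer: download in parallel, execute only after the document has been parsed -->
  <script src="/js/app.min.js" defer></script>
</body>
```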
Potential SEO issues: A slow website creates a poor user experience for everyone, even search engines. Google itself defers loading JavaScript to save resources, so it's important to make sure that any JavaScript served to clients is coded and delivered efficiently to help safeguard rankings.
JavaScript SEO for meta data
Also, it’s important to note that SPAs that utilize a router package like react-router or vue-router have to take some extra steps to handle things like changing meta tags when navigating between router views. This is usually handled with a Node.js package like vue-meta or react-meta-tags.
What are router views? Here’s how linking to different “pages” in a Single Page Application works in React in five steps:
When a user visits a React website, a GET request is sent to the server for the ./index.html file.
The server then sends the index.html page to the client, containing the scripts to launch React and React Router.
The web application is then loaded on the client-side.
If a user clicks on a link to go to a new page (/example), a request is triggered for the new URL.
React Router intercepts the request before it reaches the server and handles the change of page itself. This is done by locally updating the rendered React components and changing the URL client-side.
In other words, when users or bots follow links to URLs on a React website, they are not being served multiple static HTML files. Instead, the React components (like headers, footers, and body content) hosted on the root ./index.html file are simply reorganized to display different content. This is why they're called Single Page Applications!
Potential SEO issues: So, it’s important to use a package like React Helmet for making sure that users are being served unique metadata for each page, or “view,” when browsing SPAs. Otherwise, search engines may be crawling the same metadata for every page, or worse, none at all!
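As a hedged illustration (the component name, URLs, and copy below are placeholders), a React Helmet setup for one "view" might look like this:

```jsx
import React from "react";
import { Helmet } from "react-helmet";

// Hypothetical route component: each view renders its own unique title, description, and canonical URL.
function ExamplePage() {
  return (
    <div>
      <Helmet>
        <title>Example Page | My SPA</title>
        <meta name="description" content="Unique description for the /example view." />
        <link rel="canonical" href="https://www.example.com/example" />
      </Helmet>
      <h1>Example Page</h1>
    </div>
  );
}

export default ExamplePage;
```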
How does this all affect SEO in the bigger picture? Next, we need to learn how Google processes JavaScript.
How does Google handle JavaScript?
In order to understand how JavaScript affects SEO, we need to understand what exactly happens when Googlebot crawls a web page:
Crawl
Render
Index
First, Googlebot crawls the URLs in its queue, page by page. The crawler makes a GET request to the server, typically using a mobile user-agent, and then the server sends the HTML document.
Then, Google decides what resources are necessary to render the main content of the page. Usually, this means only the static HTML is crawled, and not any linked CSS or JS files. Why?
According to Google Webmasters, Googlebot has discovered approximately 130 trillion web pages. Rendering JavaScript at scale can be costly. The sheer computing power required to download, parse, and execute JavaScript in bulk is massive.
This is why Google may defer rendering JavaScript until later. Any unexecuted resources are queued to be processed by Google's Web Rendering Service (WRS) as computing resources become available.
Finally, Google will index any rendered HTML after JavaScript is executed.
Google crawl, render, and index process.
In other words, Google crawls and indexes content in two waves:
The first wave of indexing, or the instant crawling of the static HTML sent by the webserver
The second wave of indexing, or the deferred crawling of any additional content rendered via JavaScript
Google wave indexing. Source: Google I/O'18
The bottom line is that content dependent on JS to be rendered can experience a delay in crawling and indexing by Google. This used to take days or even weeks. For example, Googlebot historically ran on the outdated Chrome 41 rendering engine. However, Google has significantly improved its web crawler in recent years.
Googlebot was recently upgraded to the latest stable release of the Chromium headless browser in May 2019. This means that Google's web crawler is now “evergreen” and fully compatible with ECMAScript 6 (ES6) and higher, or the latest versions of JavaScript.
So, if Googlebot can technically run JavaScript now, why are we still worried about indexing issues?
The short answer is crawl budget. This is the concept that Google limits how frequently it crawls a given website because of its finite computing resources. We already know that Google defers JavaScript execution to save crawl budget.
While the delay between crawling and rendering has been reduced, there is no guarantee that Google will actually execute the JavaScript code waiting in its Web Rendering Service queue.
Here are some reasons why Google might not actually ever run your JavaScript code:
Blocked in robots.txt
Timeouts
Errors
Therefore, JavaScript can cause SEO issues when core content relies on JavaScript but is not rendered by Google.
Real-world application: JavaScript SEO for e-commerce
E-commerce websites are a real-life example of dynamic content that is injected via JavaScript. For example, online stores commonly load products onto category pages via JavaScript.
JavaScript can allow e-commerce websites to update products on their category pages dynamically. This makes sense because their inventory is in a constant state of flux due to sales. However, is Google actually able to “see” your content if it does not execute your JS files?
For e-commerce websites, which depend on online conversions, not having their products indexed by Google could be disastrous.
How to test and debug JavaScript SEO issues
Here are steps you can take today to proactively diagnose any potential JavaScript SEO issues:
Visualize the page with Google’s Webmaster Tools. This helps you to view the page from Google’s perspective.
Use the site: search operator to check Google's index. Make sure that all of your JavaScript content is being indexed properly by manually checking Google.
Debug using Chrome's built-in dev tools. Compare and contrast the static source code with the rendered code that users see, and ensure that the core content appears in both.
There are also handy third-party tools and plugins that you can use. We’ll talk about these soon.
Google Webmaster Tools
The best way to determine if Google is experiencing technical difficulties when attempting to render your pages is to test your pages using Google Webmaster tools, such as:
URL Inspection tool in Search Console
Mobile-Friendly Test
Google Mobile-Friendly Test.
The goal is simply to compare the content visible in your browser with the content rendered in these tools, and to look for any discrepancies.
Both of these Google Webmaster tools use the same evergreen Chromium rendering engine as Google. This means that they can give you an accurate visual representation of what Googlebot actually “sees” when it crawls your website.
There are also third-party technical SEO tools, like Merkle’s fetch and render tool. Unlike Google’s tools, this web application actually gives users a full-size screenshot of the entire page.
Site: Search Operator
Alternatively, if you are unsure if JavaScript content is being indexed by Google, you can perform a quick check-up by using the site: search operator on Google.
After the site: operator and your domain name, paste a snippet of the content you're not sure Google is indexing, and then press the return key. If you can find your page in the search results, then no worries! Google can crawl, render, and index your content just fine. If not, it means your JavaScript content might need some help gaining visibility.
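For example (the domain and quoted snippet below are placeholders), the query might look like:

```text
site:example.com "JavaScript SEO is the discipline of technical SEO"
```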
Here’s what this looks like in the Google SERP:
Chrome Dev Tools
Another method you can use to test and debug JavaScript SEO issues is the built-in functionality of the developer tools available in the Chrome web browser.
Right-click anywhere on a web page to display the options menu and then click “View Page Source” to see the static HTML document in a new tab.
You can also click “Inspect Element” after right-clicking to view the content that is actually loaded in the DOM, including JavaScript.
Inspect Element.
Compare and contrast these two perspectives to see if any core content is only loaded in the DOM, but not hard-coded in the source. There are also third-party Chrome extensions that can help do this, like the Web Developer plugin by Chris Pederick or the View Rendered Source plugin by Jon Hogg.
How to fix JavaScript rendering issues
After diagnosing a JavaScript rendering problem, how do you resolve JavaScript SEO issues? The answer is simple: Universal JavaScript, also known as “Isomorphic” JavaScript.
What does this mean? Universal or Isomorphic here refers to JavaScript applications that are capable of being run on either the server or the client.
There are a few ways of implementing JavaScript rendering that are more search-friendly than pure client-side rendering because they avoid offloading all of the JS work onto users and crawlers:
Server-side rendering (SSR). This means that JS is executed on the server for each request. One way to implement SSR is with a Node.js library like Puppeteer. However, this can put a lot of strain on the server.
Hybrid rendering. This is a combination of both server-side and client-side rendering. Core content is rendered server-side before being sent to the client. Any additional resources are offloaded to the client.
Dynamic rendering. In this workaround, the server detects the user agent of the client making the request. It can then send pre-rendered JavaScript content to search engines, for example, while any other user agents render the content client-side. Google Webmasters recommend a popular open-source solution called Rendertron for implementing dynamic rendering.
Incremental Static Regeneration, or updating static content after a site has already been deployed. This can be done with frameworks like Next.js for React or Nuxt.js for Vue. These frameworks have a build process that will pre-render every page of your JS application to static assets that you can serve from something like an S3 bucket. This way, your site can get all of the SEO benefits of server-side rendering, without the server management! (A minimal sketch follows this list.)
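As a minimal sketch of Incremental Static Regeneration with the Next.js pages router (the route, data source, and revalidation window are all illustrative assumptions):

```jsx
// pages/products/[id].js: a minimal ISR sketch for a hypothetical product page.
export async function getStaticPaths() {
  // Build pages on first request, then serve the cached static version.
  return { paths: [], fallback: "blocking" };
}

export async function getStaticProps({ params }) {
  const res = await fetch(`https://api.example.com/products/${params.id}`);
  const product = await res.json();
  return {
    props: { product },
    // Re-generate this static page in the background at most once per minute.
    revalidate: 60,
  };
}

export default function Product({ product }) {
  return <h1>{product.name}</h1>;
}
```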
Each of these solutions helps make sure that, when search engine bots make requests to crawl HTML documents, they receive the fully rendered versions of the web pages. However, some of these can be extremely difficult or even impossible to implement after web infrastructure is already built. That’s why it’s important to keep JavaScript SEO best practices in mind when designing the architecture of your next web application.
Note, for websites built on a content management system (CMS) that already pre-renders most content, like WordPress or Shopify, this isn’t typically an issue.
Key takeaways
This guide provides some general best practices and insights into JavaScript SEO. However, JavaScript SEO is a complex and nuanced field of study. We recommend that you read through Google’s official documentation and troubleshooting guide for more JavaScript SEO basics. Interested in learning more about optimizing your JavaScript website for search? Leave a comment below.
The web has moved from plain HTML - as an SEO you can embrace that. Learn from JS devs & share SEO knowledge with them. JS's not going away.
— John (@JohnMu) August 8, 2017
Want to learn more about technical SEO? Check out the Moz Academy Technical SEO Certification Series, an in-depth training series that hones in on the nuts and bolts of technical SEO.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
0 notes